Learning to Recognize Daily Actions Using Gaze

Authors

  • Alireza Fathi
  • Yin Li
  • James M. Rehg
Abstract

We present a probabilistic generative model for simultaneously recognizing daily actions and predicting gaze locations in videos recorded from an egocentric camera. We focus on activities requiring eye-hand coordination and model the spatio-temporal relationship between the gaze point, the scene objects, and the action label. Our model captures the fact that the distribution of both visual features and object occurrences in the vicinity of the gaze point is correlated with the verb-object pair describing the action. It explicitly incorporates known properties of gaze behavior from the psychology literature, such as the temporal delay between fixation and manipulation events. We present an inference method that can predict the best sequence of gaze locations and the associated action label from an input sequence of images. We demonstrate improvements in action recognition rates and gaze prediction accuracy relative to state-of-the-art methods, on two new datasets that contain egocentric videos of daily activities and gaze.
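The abstract describes jointly inferring a gaze-location sequence and an action label, where features near the gaze point are correlated with the action. The following is a minimal toy sketch of that idea, not the paper's actual model: it assumes a hypothetical isotropic-Gaussian appearance model per action and, for each candidate action, picks the per-frame gaze cell that maximizes the feature likelihood, then selects the action with the best total score. All names, dimensions, and the synthetic data are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

T, K, G = 5, 3, 4  # frames, candidate actions, candidate gaze cells (toy sizes)

# Hypothetical per-action mean of the 2-D features observed around the gaze
# point. The real model is richer (objects, temporal delays); this is a sketch.
means = np.array([[0.0, 0.0], [2.0, 2.0], [-2.0, 2.0]])

def log_lik(feat, k):
    """Isotropic Gaussian log-likelihood of gaze-centered features under action k."""
    return -0.5 * np.sum((feat - means[k]) ** 2)

# Synthetic video: per-frame, per-gaze-cell features generated from action 1.
video = means[1] + 0.1 * rng.normal(size=(T, G, 2))

best_action, best_score, best_gaze = None, -np.inf, None
for k in range(K):
    # With the action fixed, the best gaze cell in each frame is independent here.
    per_frame = np.array([[log_lik(video[t, g], k) for g in range(G)]
                          for t in range(T)])
    gaze_seq = per_frame.argmax(axis=1)      # predicted gaze cell per frame
    score = per_frame.max(axis=1).sum()      # joint score for this action
    if score > best_score:
        best_action, best_score, best_gaze = k, score, gaze_seq

print(best_action)  # recovers action 1 on this synthetic data
```

In this simplified setting the gaze choice decouples across frames; the paper's inference additionally couples frames through temporal structure such as the fixation-before-manipulation delay.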


Similar Articles

Exploiting Three-Dimensional Gaze Tracking for Action Recognition During Bimanual Manipulation to Enhance Human–Robot Collaboration

Human–robot collaboration could be advanced by facilitating the intuitive, gaze-based control of robots, and enabling robots to recognize human actions, infer human intent, and plan actions that support human goals. Traditionally, gaze tracking approaches to action recognition have relied upon computer vision-based analyses of two-dimensional egocentric camera videos. The objective of this stud...


Direct Gaze May Modulate Face Recognition in Newborns

Faces are important for non-verbal communication in daily life, and eye gaze direction provides important information for adult–infant interaction. Four-month-old infants and adults better recognize faces when accompanied with direct gaze, suggesting a special status of ‘eye contact’. Whether mutual gaze plays a role in face recognition from birth, or whether it requires expertise, is investiga...


How to Detect a Loss of Attention in a Tutoring System using Facial Expressions and Gaze Direction

Tutoring systems try to help students with certain topics, but unlike a normal teacher it is not very easy to notice when a student is distracted. It is possible to detect the gaze direction to detect a loss of attention, but this might not be enough. This paper describes all the parts needed for a detection system that uses the detection of facial expressions and gaze direction to recognize if...


How we look tells us what we do: Action recognition using human gaze.

Can a person's interpretation of a scene, as reflected in their gaze patterns, be harnessed to recognize different classes of actions? Behavioral data were acquired from a previous study in which participants (n=8) saw 500 images from the PASCAL VOC 2012 Actions image set. Each image was freely viewed for 3 seconds and was followed by a 10-AFC test in which the depicted human action had to be s...


Comparing and Combining Eye Gaze and Interface Actions for Determining User Learning with an Interactive Simulation

This paper presents an experimental evaluation of eye gaze data as a source for modeling user’s learning in Interactive Simulations (IS). We compare the performance of classifier user models trained only on gaze data vs. models trained only on interface actions vs. models trained on the combination of these two sources of user interaction data. Our long-term goal is to build user models that ca...




Publication date: 2012